Regularization in relevance learning vector quantization using l1-norms

Authors

  • Martin Riedel
  • Fabrice Rossi
  • Marika Kaden
  • Thomas Villmann
Abstract

We propose in this contribution a method for l1-regularization in prototype-based relevance learning vector quantization (LVQ) to obtain sparse relevance profiles. In hyperspectral data analysis, sparse relevance profiles fade out those spectral bands which are not necessary for classification. In particular, we consider sparsity in the relevance profile enforced by LASSO optimization. The latter is realized by a gradient learning scheme using a differentiable, parametrized approximation of the l1-norm with a known upper error bound. We extend this regularization idea also to the matrix-learning variant of LVQ as the natural generalization of relevance learning.
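The abstract does not quote the particular parametrized surrogate of the l1-norm. Below is a minimal Python sketch, assuming the common choice |x| ≈ sqrt(x^2 + eps), whose per-component error is bounded by sqrt(eps); the function names, the learning rate lr, the penalty weight beta, and the renormalization step are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def smooth_l1(lam, eps=1e-3):
        # Differentiable surrogate of the l1-norm: sum_i sqrt(lam_i^2 + eps).
        # Since 0 <= sqrt(x^2 + eps) - |x| <= sqrt(eps), the total
        # approximation error is bounded above by len(lam) * sqrt(eps).
        return np.sum(np.sqrt(lam**2 + eps))

    def smooth_l1_grad(lam, eps=1e-3):
        # Gradient of the surrogate; acts as the LASSO term in the
        # relevance update of a gradient-based relevance LVQ scheme.
        return lam / np.sqrt(lam**2 + eps)

    def update_relevances(lam, cost_grad, lr=0.01, beta=0.1, eps=1e-3):
        # Hypothetical relevance step: descend the unregularized LVQ cost
        # gradient plus the weighted sparsity penalty, then renormalize.
        lam = lam - lr * (cost_grad + beta * smooth_l1_grad(lam, eps))
        lam = np.clip(lam, 0.0, None)   # relevances stay non-negative
        return lam / lam.sum()          # relevance profile sums to one

With beta = 0 this reduces to plain relevance learning; increasing beta drives more relevance weights towards zero, i.e. fades out more spectral bands.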


Similar resources

Regularization in Relevance Learning Vector Quantization Using l1-Norms

We propose in this contribution a method for l1-regularization in prototype based relevance learning vector quantization (LVQ) for sparse relevance profiles. Sparse relevance profiles in hyperspectral data analysis fade down those spectral bands which are not necessary for classification. In particular, we consider the sparsity in the relevance profile enforced by LASSO optimization. The latter...

Stationarity of Matrix Relevance Learning Vector Quantization

We investigate the convergence properties of heuristic matrix relevance updates in Learning Vector Quantization. Under mild assumptions on the training process, stationarity conditions can be worked out which characterize the outcome of training in terms of the relevance matrix. It is shown that the original training schemes single out one specific direction in feature space which depends on th...

Sparse Functional Relevance Learning in Generalized Learning Vector Quantization

Relevance learning in learning vector quantization is a central paradigm for classification-task-dependent feature weighting and selection. We propose a functional approach to relevance learning for high-dimensional functional data. For this purpose we compose the relevance profile as a superposition of only a few parametrized basis functions, taking into account the functional character of the dat...
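For illustration only: the parametrization used in that paper is not given in this snippet, but the sketch below, assuming Gaussian basis functions, shows how a relevance profile over many spectral bands can be composed from only a few parameters. The function name and example values are hypothetical.

    import numpy as np

    def relevance_profile(band_positions, centers, widths, heights):
        # Superposition of a few parametrized (here: Gaussian) basis functions
        # evaluated at the band positions, normalized to a relevance profile.
        t = band_positions[:, None]                          # (n_bands, 1)
        basis = np.exp(-(t - centers)**2 / (2 * widths**2))  # (n_bands, K)
        lam = basis @ heights                                # (n_bands,)
        return lam / lam.sum()

    # Example: 200 spectral bands described by just 3 basis functions.
    bands = np.linspace(0.0, 1.0, 200)
    profile = relevance_profile(bands,
                                centers=np.array([0.2, 0.5, 0.8]),
                                widths=np.array([0.05, 0.10, 0.05]),
                                heights=np.array([1.0, 0.3, 0.6]))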

Regularization in matrix learning

We present a regularization technique to extend recently proposed matrix learning schemes in Learning Vector Quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can display a tendency towards over-simplification in the course of training. An overly pronounced elimination of dimensions...

Applications of lp-Norms and their Smooth Approximations for Gradient Based Learning Vector Quantization

Learning vector quantization applying non-standard metrics has become quite popular for improving classification performance compared to standard approaches using the Euclidean distance. Kernel metrics and quadratic forms belong to the most promising approaches. In this paper we consider Minkowski distances (lp-norms). In particular, l1-norms are known to be robust against noise in data, such tha...
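The exact approximation used in that paper is not quoted here; a minimal sketch, assuming the smooth surrogate ||x||_p ≈ (sum_i (x_i^2 + eps)^(p/2))^(1/p), is shown below. The function names are illustrative; for p = 1 the surrogate coincides with the smooth l1 approximation sketched above.

    import numpy as np

    def smooth_lp(x, p=1.0, eps=1e-3):
        # Differentiable surrogate of the Minkowski lp-norm of x.
        return np.sum((x**2 + eps)**(p / 2))**(1.0 / p)

    def smooth_lp_grad(x, p=1.0, eps=1e-3):
        # Gradient of the surrogate with respect to x, usable in
        # gradient-based LVQ with an lp dissimilarity.
        s = np.sum((x**2 + eps)**(p / 2))
        return s**(1.0 / p - 1.0) * (x**2 + eps)**(p / 2 - 1.0) * x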


Journal:

Volume   Issue

Pages   -

Publication date: 2013